
Search for: All records

Creators/Authors contains: "Malefyt, Amanda"


  1. Problem solving is a signature skill of engineers. Here, problem solving is exercised when students apply course concepts to reverse engineer YouTube videos and solve new student-written, homework-style problems (YouTube problems). By replacing some textbook problems with YouTube problems, this research examines both the rigor of YouTube problems and students' problem-solving skills on textbook and YouTube problems. A quasi-experimental, treatment/control group design was employed, and data were collected and evaluated using multiple measurement instruments. First, the rigor of homework problems was examined using the NASA Task Load Index (TLX). Problem solving was then assessed using a previously developed rubric called PROCESS: Problem definition, Representing the problem, Organizing the information, Calculations, Evaluating the solution, Solution communication, and Self-assessment. PROCESS was modified to measure the completeness and accuracy of student responses independently and to identify errors committed in material and energy balances. In the treatment group, students were assigned ten textbook problems and nine YouTube problems. While the control group obtained higher PROCESS scores at the beginning of the study, both groups exhibited similar problem-solving skills near the end. The rigor of student-written YouTube problems was also similar to that of textbook problems covering the same course concepts. (Sketches of the standard weighted TLX computation and of an illustrative PROCESS scoring record follow these abstracts.)
  2. This evidence-based practices paper discusses the method employed to validate a project-modified version of the PROCESS tool (Grigg, Van Dyken, Benson, & Morkos, 2013) for measuring students' problem-solving skills. The PROCESS tool allows raters to score students' ability in the domains of Problem definition, Representing the problem, Organizing information, Calculations, Evaluating the solution, Solution communication, and Self-assessment. Specifically, this research compares student performance on traditional textbook problems with performance on novel, student-generated learning activities (i.e., reverse engineering videos and then writing an original homework problem and solution). Using student-generated learning activities to assess problem-solving skills has theoretical underpinnings in Felder's (1987) work on "creating creative engineers," as well as in the need to develop students' abilities to transfer learning and solve problems in a variety of real-world settings. In this study, four raters used the PROCESS tool to score the performance of 70 students randomly selected from two undergraduate chemical engineering cohorts at two Midwest universities. Students from both cohorts solved 12 traditional textbook-style problems, and students from the second cohort solved an additional nine student-generated video problems. Any large-scale assessment in which multiple raters use a rating tool requires investigation of several aspects of validity. The many-facets Rasch measurement model (MFRM; Linacre, 1989) has the psychometric properties needed to determine whether characteristics other than students' problem-solving skills influence the scores assigned, such as rater bias, problem difficulty, or student demographics (a standard formulation of this model appears after these abstracts). Before implementing the full rating plan, MFRM was used to examine how raters interacted with the six items on the modified PROCESS tool when scoring a random selection of 20 students' performance in solving one problem. An external evaluator led inter-rater reliability meetings in which raters deliberated the rationale for their ratings, and differences were resolved by recourse to Pretz et al.'s (2003) problem-solving cycle, which informed the development of the PROCESS tool. To test the new understandings of the PROCESS tool, raters were assigned to score one new problem from a different randomly selected group of six students. Those results were then analyzed in the same manner as before. This iterative process resulted in substantial increases in reliability, which can be attributed to increased confidence that raters were operating with common definitions of the items on the PROCESS tool and rating with consistent and comparable severity. This presentation includes examples of the student-generated problems and a discussion of common discrepancies in raters' initial use of the PROCESS tool and how they were resolved. The findings, as well as the adapted PROCESS tool used in this study, can be useful to engineering educators and engineering education researchers.
  3. Homework problems from many textbooks have solutions manuals available on the Internet. Students find these solutions manuals and focus on getting the right answer, putting little or no effort into learning. Problem solving is a signature skill of engineers that is in high demand across industries. Here, problem solving is exercised when students apply course concepts to reverse engineer YouTube videos and create new homework-quality problem statements and solutions. This research examines the rigor of YouTube problems and their effect on students' problem-solving skills. A quasi-experimental, treatment/control group design is employed, and data are collected and evaluated using a variety of tools. The rigor of YouTube problems is examined using the NASA Task Load Index (TLX). Problem-solving skills are examined using the PROCESS rubric (Problem definition, Representing the problem, Organizing information, Calculations, Evaluating the solution, Solution communication, and Self-assessment) and other tools.
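Abstracts 1 and 3 rate the rigor of homework problems with the NASA Task Load Index (TLX). In the standard weighted form of the instrument, each of six subscales is rated on a 0-100 scale, weights for the subscales come from 15 pairwise comparisons (so they sum to 15), and the overall workload is the weighted sum of ratings divided by 15. The sketch below shows only that standard computation; it is not the study's survey or analysis code, and the example ratings and weights are hypothetical.

    # Minimal sketch of the standard weighted NASA TLX computation.
    # Ratings are on a 0-100 scale; weights come from 15 pairwise
    # comparisons of the six subscales, so the weights sum to 15.
    TLX_DIMENSIONS = (
        "mental_demand", "physical_demand", "temporal_demand",
        "performance", "effort", "frustration",
    )

    def weighted_tlx(ratings: dict, weights: dict) -> float:
        """Return the overall weighted workload score (0-100)."""
        if sum(weights.values()) != 15:
            raise ValueError("pairwise-comparison weights must sum to 15")
        total = sum(ratings[d] * weights[d] for d in TLX_DIMENSIONS)
        return total / 15.0

    # Hypothetical example: one student's ratings for one homework problem.
    ratings = {"mental_demand": 70, "physical_demand": 5, "temporal_demand": 55,
               "performance": 40, "effort": 65, "frustration": 35}
    weights = {"mental_demand": 5, "physical_demand": 0, "temporal_demand": 3,
               "performance": 2, "effort": 4, "frustration": 1}
    print(round(weighted_tlx(ratings, weights), 1))  # 59.3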
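Abstract 1 describes a modified PROCESS rubric in which completeness and accuracy are scored independently for each dimension. The snippet below is only an illustrative way of recording such ratings: the seven dimension names come from the abstracts, but the 0-2 scale, the aggregation, and all example values are assumptions rather than the study's actual rubric.

    # Hypothetical record of modified-PROCESS ratings, with completeness and
    # accuracy scored independently on each of the seven dimensions.
    from dataclasses import dataclass

    PROCESS_DIMENSIONS = (
        "problem_definition", "representing_the_problem", "organizing_information",
        "calculations", "evaluating_the_solution", "solution_communication",
        "self_assessment",
    )

    @dataclass
    class DimensionScore:
        completeness: int  # assumed scale: 0 = missing, 1 = partial, 2 = complete
        accuracy: int      # assumed scale: 0 = incorrect, 1 = flawed, 2 = correct

    def totals(scores: dict) -> tuple:
        """Sum completeness and accuracy separately across the seven dimensions."""
        comp = sum(scores[d].completeness for d in PROCESS_DIMENSIONS)
        acc = sum(scores[d].accuracy for d in PROCESS_DIMENSIONS)
        return comp, acc

    # Hypothetical example: one rater's scores for one student solution.
    example = {d: DimensionScore(completeness=2, accuracy=1) for d in PROCESS_DIMENSIONS}
    example["calculations"] = DimensionScore(completeness=1, accuracy=0)
    print(totals(example))  # (13, 6)

Keeping the two totals separate mirrors the stated goal of measuring completeness and accuracy independently rather than collapsing them into a single score.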
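Abstract 2 analyzes rater behavior with the many-facets Rasch measurement model (MFRM; Linacre, 1989). A standard formulation of that model for a student-by-item-by-rater rating design with ordered score categories is sketched below; the facets and parameterization actually fitted in the study may differ.

    \log\left( \frac{P_{nijk}}{P_{nij(k-1)}} \right) = B_n - D_i - C_j - F_k

Here B_n is the problem-solving ability of student n, D_i the difficulty of PROCESS item i, C_j the severity of rater j, F_k the difficulty of awarding rating-scale category k relative to category k-1, and P_nijk the probability that rater j assigns category k to student n on item i. Fitting such a model yields the rater-severity and item-difficulty estimates that the inter-rater reliability meetings described above are intended to bring into closer alignment.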